
1. Identity statement
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Identifier: 8JMKD3MGPEW34M/47QSJL5
Repository: sid.inpe.br/sibgrapi/2022/10.17.08.56
Last Update: 2022:10.17.08.56.54 (UTC) dioognei@gmail.com
Metadata Repository: sid.inpe.br/sibgrapi/2022/10.17.08.56.54
Metadata Last Update: 2023:05.23.04.20.43 (UTC) administrator
Citation Key: MatosNasc:2022:MuApAc
Title: Musical Hyperlapse: A Multimodal Approach to Accelerate First-Person Videos
Short Title: Musical Hyperlapse: A Multimodal Approach to Accelerate First-Person Videos
Format: On-line
Year: 2022
Access Date: 2024, May 19
Number of Files: 1
Size: 865 KiB
2. Context
Author: 1. Matos, Diognei; 2. Nascimento, Erickson R.
Affiliation: 1. Federal University of Minas Gerais; 2. Federal University of Minas Gerais
e-Mail Address: dioognei@gmail.com
Conference Name: Conference on Graphics, Patterns and Images, 35 (SIBGRAPI)
Conference Location: Natal, RN
Date: 24-27 Oct. 2022
Book Title: Proceedings
Tertiary Type: Master's or Doctoral Work
History (UTC): 2022-10-17 08:56:54 :: dioognei@gmail.com -> administrator ::
2023-05-23 04:20:43 :: administrator -> :: 2022
3. Content and structure
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Keywords: computer vision; music emotion recognition; image emotion recognition; semantic hyperlapse
Abstract: With the advance of technology and social media usage, recording first-person videos has become a common habit. These videos are usually very long and tiring to watch, bringing the need to speed them up. Despite recent progress in fast-forward methods, they do not consider inserting background music into the videos, which could make them more enjoyable. This thesis presents a new method that creates accelerated videos and includes background music, keeping the same emotion induced by the visual and acoustic modalities. Our approach is based on the automatic recognition of the emotions induced by the music and video contents, and on an optimization algorithm that maximizes the visual quality of the output video while seeking to match the emotions of the music and the video. Quantitative results show that, compared with other methods in the literature, our method achieves the best performance in matching emotion similarity while maintaining the visual quality of the output video. Visual results can be seen at: https://youtu.be/9ykQa9zhcz8.
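As a rough illustration of the emotion-matching idea described in the abstract, the sketch below greedily selects frames whose predicted emotion is closest to the emotion of the concurrently playing music, assuming emotions are represented as (valence, arousal) pairs, a common choice in emotion recognition. The function names, the greedy strategy, and the weights w_emotion and w_skip are hypothetical illustrations, not the optimization formulation used in the thesis.

import numpy as np

def emotion_distance(video_emotion, music_emotion):
    # Euclidean distance between two (valence, arousal) pairs; smaller means more similar.
    return float(np.linalg.norm(np.asarray(video_emotion) - np.asarray(music_emotion)))

def select_frames(video_emotions, music_emotions, speedup, w_emotion=1.0, w_skip=0.1):
    # Greedy emotion-aware fast-forward sketch (hypothetical): starting from frame 0,
    # repeatedly jump to the candidate frame (within 2x the target speed-up) whose
    # predicted emotion best matches the emotion of the music playing at that point in
    # the output video, while penalizing jumps that deviate from the desired rate.
    selected = [0]
    i = 0
    while i < len(video_emotions) - 1:
        candidates = range(i + 1, min(i + 2 * speedup, len(video_emotions)))
        t_music = min(len(selected), len(music_emotions) - 1)  # music time index in the output
        best = min(
            candidates,
            key=lambda j: w_emotion * emotion_distance(video_emotions[j], music_emotions[t_music])
            + w_skip * abs((j - i) - speedup),
        )
        selected.append(best)
        i = best
    return selected

For example, with a target speed-up of 10, select_frames keeps roughly one in every ten frames, biased toward those whose predicted emotion matches the background music at that moment.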
Arrangement: urlib.net > SDLA > Fonds > SIBGRAPI 2022 > Musical Hyperlapse: A Multimodal Approach to Accelerate First-Person Videos
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content: agreement.html 17/10/2022 05:56 1.6 KiB
4. Conditions of access and use
data URL: http://urlib.net/ibi/8JMKD3MGPEW34M/47QSJL5
zipped data URL: http://urlib.net/zip/8JMKD3MGPEW34M/47QSJL5
Language: en
Target File: Musical Hyperlapse WTD Paper.pdf
User Group: dioognei@gmail.com
Visibility: shown
5. Allied materials
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPEW34M/495MHJ8
Citing Item List: sid.inpe.br/sibgrapi/2023/05.19.12.10 8
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
6. Notes
Empty Fields: archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination documentstage doi edition editor electronicmailaddress group holdercode isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project publisher publisheraddress readergroup readpermission resumeid rightsholder schedulinginformation secondarydate secondarykey secondarymark secondarytype serieseditor session sponsor subject tertiarymark type url versiontype volume

